Accelerating alternating least squares for tensor decomposition by pairwise perturbation
Authors
Abstract
The alternating least squares (ALS) algorithm for CP and Tucker decomposition is dominated in cost by the tensor contractions necessary to set up the quadratic optimization subproblems. We introduce a novel family of algorithms that uses perturbative corrections to the subproblems rather than recomputing the tensor contractions. This approximation is accurate when the factor matrices are changing little across iterations, which occurs when ALS approaches convergence. We provide a theoretical analysis to bound the approximation error. Our numerical experiments demonstrate that the proposed pairwise perturbation algorithms are easy to control and converge to minima that are as good as those of ALS. The experimental results show improvements of up to 3.1× with respect to state-of-the-art ALS approaches on various model problems and real datasets.
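For concreteness, below is a minimal NumPy sketch of a CP-ALS sweep for a dense 3-way tensor; it is an illustrative sketch, not the paper's implementation, and names such as cp_als, unfold, and rank are assumptions for the example. The MTTKRP contraction marked in the comments is the dominant setup cost that the pairwise perturbation scheme approximates with perturbative corrections instead of recomputing once the factor matrices change little between sweeps.

import numpy as np

def unfold(T, mode):
    # Mode-n unfolding: bring `mode` to the front, flatten the rest (C order).
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def cp_als(T, rank, n_sweeps=50, seed=0):
    # Plain CP-ALS for a 3-way tensor (sketch only).
    rng = np.random.default_rng(seed)
    A = [rng.standard_normal((s, rank)) for s in T.shape]
    for _ in range(n_sweeps):
        for n in range(3):
            others = [A[m] for m in range(3) if m != n]
            # Khatri-Rao product of the remaining factors, matching the
            # column ordering of the C-order unfolding above.
            kr = np.einsum('ir,jr->ijr', others[0], others[1]).reshape(-1, rank)
            # MTTKRP: the expensive tensor contraction that dominates ALS cost
            # and that pairwise perturbation approximates near convergence.
            mttkrp = unfold(T, n) @ kr
            # Gram matrix of the Khatri-Rao product via a Hadamard product.
            gram = (others[0].T @ others[0]) * (others[1].T @ others[1])
            A[n] = np.linalg.solve(gram, mttkrp.T).T
    return A

# Example usage: factors = cp_als(np.random.rand(30, 30, 30), rank=5)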
Similar resources
Some Convergence Results on the Regularized Alternating Least-Squares Method for Tensor Decomposition
We study the convergence of the Regularized Alternating Least-Squares algorithm for tensor decompositions. As a main result, we have shown that given the existence of critical points of the Alternating Least-Squares method, the limit points of the converging subsequences of the RALS are the critical points of the least squares cost functional. Some numerical examples indicate a faster convergen...
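As a hedged illustration of the kind of regularized subproblem such convergence results concern (one common proximal form; the exact regularization used in that work may differ), the mode-n RALS update can be written as

\[
A^{(n)}_{k+1} = \arg\min_{A} \left\| X_{(n)} - A \Big( \bigodot_{m \neq n} A^{(m)}_k \Big)^{\top} \right\|_F^2 + \lambda \left\| A - A^{(n)}_k \right\|_F^2 ,
\]

where X_{(n)} is the mode-n unfolding and \lambda > 0 makes each subproblem strictly convex even when the Khatri-Rao Gram matrix is rank deficient.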
DMS: Distributed Sparse Tensor Factorization with Alternating Least Squares
Tensors are data structures indexed along three or more dimensions. They have found increasing use in domains such as data mining and recommender systems, where dimensions can have enormous length and the resulting tensors are very sparse. The canonical polyadic decomposition (CPD) is a popular tensor factorization for discovering latent features and is most commonly found via the method of alternating...
Tensor Decompositions, Alternating Least Squares and other Tales
This work was originally motivated by a classification of tensors proposed by Richard Harshman. In particular, we focus on simple and multiple “bottlenecks”, and on “swamps”. Existing theoretical results are surveyed, some numerical algorithms are described in detail, and their numerical complexity is calculated. In particular, the interest in using the ELS enhancement in these algorithms is d...
Irreducibility of Tensor Squares, Symmetric Squares and Alternating Squares
We investigate the question when the tensor square, the alternating square, or the symmetric square of an absolutely irreducible projective representation V of an almost simple group G is again irreducible. The knowledge of such representations is of importance in the description of the maximal subgroups of simple classical groups of Lie type. We show that if G is of Lie type in odd characteris...
Regularized Alternating Least Squares Algorithms for Non-negative Matrix/Tensor Factorization
Nonnegative Matrix and Tensor Factorization (NMF/NTF) and Sparse Component Analysis (SCA) have already found many potential applications, especially in multi-way Blind Source Separation (BSS), multi-dimensional data analysis, model reduction and sparse signal/image representations. In this paper we propose a family of the modified Regularized Alternating Least Squares (RALS) algorithms for NMF/...
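A minimal sketch of one common regularized-ALS update for the nonnegative matrix case follows; it assumes ridge regularization with projection onto the nonnegative orthant, which is not necessarily the exact RALS variant proposed in that paper, and the name rals_nmf is illustrative.

import numpy as np

def rals_nmf(X, rank, n_iter=200, reg=1e-3, seed=0):
    # Regularized ALS for NMF: ridge-regularized least-squares updates,
    # each followed by clipping to enforce nonnegativity.
    rng = np.random.default_rng(seed)
    W = rng.random((X.shape[0], rank))
    H = rng.random((rank, X.shape[1]))
    reg_eye = reg * np.eye(rank)
    for _ in range(n_iter):
        # H-update: minimize ||X - W H||_F^2 + reg ||H||_F^2, then project.
        H = np.maximum(np.linalg.solve(W.T @ W + reg_eye, W.T @ X), 0.0)
        # W-update: the same subproblem on the transposed factorization.
        W = np.maximum(np.linalg.solve(H @ H.T + reg_eye, H @ X.T).T, 0.0)
    return W, H

# Example usage: W, H = rals_nmf(np.random.rand(100, 80), rank=10)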
Journal
Journal title: Numerical Linear Algebra With Applications
Year: 2022
ISSN: 1070-5325, 1099-1506
DOI: https://doi.org/10.1002/nla.2431